• word of the day

    West Germany

    Definition
    (noun) a republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990
